Web Survey Bibliography
Social desirability bias (SDB) in surveys refers to systematic error resulting from respondents' desire to avoid embarrassment and to project a favourable image to others. Respondents may choose responses they believe are more socially desirable or acceptable rather than responses that reflect their true thoughts or feelings. This tendency is stronger for some respondents than for others (it is believed to be a personality trait based on the subject's need for approval), for some questions than for others (experience shows that it occurs more often when asking about voting behaviour, addiction-related behaviour, crimes, illnesses, sexual behaviour, charity, financial matters, and being a well-informed and cultured person), and for interviewer-administered rather than self-administered survey modes. For web surveys, previous studies have shown that the problem of SDB is smaller than in other survey modes, owing to self-administration and the perceived anonymity of the response process. Nevertheless, the presence of questions sensitive to SDB may negatively influence data quality in web surveys as well.
With our paper we will show in which cases the presence of questions sensitive to SDB negatively influences data quality. For this purpose we will study the behaviour of respondents in a sample of 50 surveys selected from more than 2,000 web surveys hosted by a web survey service at our University. We will explore how the following factors influence different indicators of data quality (item nonresponse, survey break-off, and response patterns):
- presence of questions sensitive to SDB (share among all questions, position in the questionnaire),
- reassuring subjects that their responses will be kept confidential or anonymous in the invitation page,
- characteristics of respondents.
The analysis will be performed at three levels – the level of individual questions, the level of the questionnaire, and the level of the respondent.
Studying several surveys with a meta-analytical approach and performing a multi-level statistical analysis is a relatively rare approach to studying SDB. It provides valuable insight into how the presence of questions subject to SDB influences respondent behaviour. On this basis, new knowledge is generated about tailoring questionnaire design and developing survey questions so as to minimize nonresponse and the social desirability of answers.
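To make the data-quality indicators named above concrete, here is a minimal illustrative sketch (not the authors' code; the toy data and function names are assumptions) of how item nonresponse and break-off could be operationalised for a single respondent's answer sequence, with None marking a skipped item:

```python
def item_nonresponse_rate(answers):
    """Share of presented items left unanswered (None)."""
    return sum(a is None for a in answers) / len(answers)

def broke_off(answers):
    """True if the respondent abandoned the questionnaire, i.e. every
    item after the last answered one is unanswered."""
    answered = [i for i, a in enumerate(answers) if a is not None]
    if not answered:
        return True  # answered nothing at all
    return answered[-1] < len(answers) - 1

# Example: one skipped item but the questionnaire was finished,
# versus a respondent who stopped after the second question.
r1 = ["yes", None, "no", "no"]
r2 = ["yes", "no", None, None]
print(item_nonresponse_rate(r1))  # 0.25
print(broke_off(r1), broke_off(r2))
```

Aggregating such per-respondent indicators across questions, questionnaires, and respondents would then feed the three-level analysis described in the abstract.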
Web survey bibliography (4086)
- Media tracker; 2012
- Measuring the quality of governmental websites in a controlled versus an online setting with the ‘...; 2012; Elling, S., Lentz, L., de Jong, M., van den Bergh, H.
- Measuring modern media consumption; 2012; Arini, N.
- ISO 20252. Market, opinion and social research-Vocabulary and service requirements, 2nd Edition; 2012
- Is “chapterisation” a viable alternative to traditional progress indicators?; 2012; Spicer, R., Dowling, Z.
- Internet use in households and by individuals in 2012. Eurostat Statistics in Focus 50/2012; 2012; Seybert, H.
- Internet access - Households and individuals, 2012 part 2; 2012
- Internet access - Households and individuals, 2012; 2012
- Google et Médiamétrie créent une audience bimédia; 2012; Gonzales, P.
- GMI Pinnacle; 2012
- Global market research 2012; 2012
- Explaining rising nonresponse rates in cross-sectional surveys; 2012; Brick, J. M., Williams, Do.
- Eurobarometer Special surveys: Special Eurobarometer 381; 2012
- Online Surveys 2.0; 2012; Elferink, R.
- The Impact of Academic Sponsorship on Online Survey Dropout Rates; 2012; Allen, P. J., Roberts, L. D.
- Especially for You: Motivating Respondents in an Internet Panel by Offering Tailored Questions; 2012; Oudejans, M.
- Social media as a data collection tool: the impact of Facebook in behavioural research; 2012; Zoppos, E.
- Smartphone Apps and User Engagement: Collecting Data in the Digital Era; 2012; Link, M. W.
- Snowball Sampling in Online Social Networks; 2012; Raissi, M., Ackland, R.
- The Use of Facebook as a Locating and Contacting Tool; 2012; McCarthy, T.
- How Often Do You Use the App with a Bird on It? Exploring Differences in Survey Completion Times, Primacy...; 2012; Buskirk, T. D.
- Data quality of questions sensitive to social-desirability bias in web surveys; 2012; Lozar Manfreda, K., Zajc, N., Berzelak, N., Vehovar, V.
- Online Questionnaires: Development of ‘basic requirements’; 2012; Tries, S., Blanke, K.
- Social research in online context: methodological reflections on web surveys from a case study; 2012; Pandolfini, V.
- Efficacy of a health-related Facebook social network site on health-seeking behaviors; 2012; Woolley, P., Peterson, M.
- The war against unengaged online respondents; 2012; Gittelman, S. H., Trimarchi, E.
- Qualitatively Speaking: The five absolute, no-excuse must-dos for online qualitative researchers; 2012; Rossow, A.
- By the Numbers: Lessons for using online panels in B2B research; 2012; Elsner, N.
- Specialized Tools for Measuring Past Events; 2012; Belli, R. F.
- Transparency, Access and the Credibility of Survey Research; 2012; Lupia, A.
- Can Microtargeting Improve Survey Sampling? An Assessment of Accuracy and Bias in Consumer File Marketing...; 2012; Pasek, J.
- Anonymity and Confidentiality; 2012; Tourangeau, R.
- Cognitive Evaluation of Survey Instruments: State of the Science (Art?) and Future Directions; 2012; Willis, G. B.
- Oh, Just One More Thing … Leveraging “Leave-Behinds” in Data Collection; 2012; Link, M. W.
- Paradata; 2012; Kreuter, F.
- Computation of Survey Weights: Bridging Theory and Practice; 2012; DeBell, M.
- Optimizing Response Rates; 2012; Brick, J. M.
- Modes of Data Collection; 2012; Tourangeau, R.
- The Use and Effects of Incentives in Surveys; 2012; Singer, E.
- Improving Question Design to Maximize Reliability and Validity; 2012; Krosnick, J. A.
- Respondent Attrition vs Data Attrition and Their Reduction; 2012; Olsen, R. J.
- Survey Interviewing: Deviations from the Script; 2012; Schaeffer, N. C.
- How accurate are surveys of objective phenomena?; 2012; Chang, L. C., Krosnick, J. A.
- Measure the response burden in the Swedish Intrastat system; 2012; Weideskog, F.
- Mode and non-response effects and their treatment; 2012; Chrysanthopoulos, S., Georgostathi, A.
- What can be said about quality in the Central Population Register based on a self-completion survey...; 2012; Falnes-Dalheim, E., Pedersen, H. E.
- Improving the quality of complex surveys: The case of the EU Labour Force Survey; 2012; van der Valk, J.
- Pros and cons of Internet based User Satisfaction Surveys; 2012; Consoli, A., Matsulevits, L.
- Between demand and reality: Ensuring efficiency and quality in pretesting questionnaires; 2012; Sattelberger, S., Blanke, K.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S., Nebel, S., Blanke, K.